In statistics, generalized least squares (GLS) is a technique for estimating the unknown parameters in a linear regression model. GLS can be used to perform linear regression when there is a certain degree of correlation between the residuals of the regression model. In these cases, ordinary least squares and weighted least squares can be statistically inefficient, or even give misleading inferences. GLS was first described by Alexander Aitken in 1934.

== Method outline ==

In a typical linear regression model we observe data on ''n'' statistical units. The response values are placed in a vector ''Y'' = (''y''<sub>1</sub>, ..., ''y''<sub>''n''</sub>)′, and the predictor values are placed in the design matrix ''X'' = (''x''<sub>''ij''</sub>), where ''x''<sub>''ij''</sub> is the value of the ''j''th predictor variable for the ''i''th unit. The model assumes that the conditional mean of ''Y'' given ''X'' is a linear function of ''X'', whereas the conditional variance of the error term given ''X'' is a ''known'' covariance matrix Ω. This is usually written as

: <math>Y = X\beta + \varepsilon, \qquad \operatorname{E}[\varepsilon \mid X] = 0, \qquad \operatorname{Cov}[\varepsilon \mid X] = \Omega.</math>

Here ''β'' is a vector of unknown "regression coefficients" that must be estimated from the data. Suppose ''b'' is a candidate estimate for ''β''. Then the residual vector for ''b'' will be ''Y'' − ''Xb''. The generalized least squares method estimates ''β'' by minimizing the squared Mahalanobis length of this residual vector:

: <math>\hat\beta = \underset{b}{\operatorname{arg\,min}} \; (Y - Xb)' \, \Omega^{-1} (Y - Xb).</math>

Since the objective is a quadratic form in ''b'', the estimator has an explicit formula:

: <math>\hat\beta = (X' \Omega^{-1} X)^{-1} X' \Omega^{-1} Y.</math>
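To make the explicit formula concrete, the following is a minimal numerical sketch in Python using NumPy. The function name <code>gls_estimate</code> and the simulated AR(1)-style error covariance are illustrative assumptions, not part of the article.

<syntaxhighlight lang="python">
import numpy as np

def gls_estimate(X, y, Omega):
    """Return beta_hat = (X' Omega^{-1} X)^{-1} X' Omega^{-1} y.

    X     : (n, p) design matrix
    y     : (n,) response vector
    Omega : (n, n) known, nonsingular error covariance matrix
    """
    Omega_inv = np.linalg.inv(Omega)
    XtOi = X.T @ Omega_inv
    # Solve the normal equations (X' Omega^{-1} X) b = X' Omega^{-1} y
    return np.linalg.solve(XtOi @ X, XtOi @ y)

# Illustrative usage: simulate data whose errors have a known
# AR(1)-style covariance Omega[i, j] = 0.9 ** |i - j|.
rng = np.random.default_rng(0)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one predictor
beta_true = np.array([1.0, 2.0])
idx = np.arange(n)
Omega = 0.9 ** np.abs(idx[:, None] - idx[None, :])
eps = rng.multivariate_normal(np.zeros(n), Omega)
y = X @ beta_true + eps

print(gls_estimate(X, y, Omega))  # should be close to [1.0, 2.0]
</syntaxhighlight>

In practice one would typically avoid inverting Ω explicitly, instead whitening the data with a Cholesky factor of Ω and running ordinary least squares on the transformed variables, which is numerically more stable; the sketch above favors directness over efficiency.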